Sufficient dimension reduction via Bayesian mixture modeling.
Authors
Abstract
Dimension reduction is central to the analysis of data with many predictors. Sufficient dimension reduction aims to identify the smallest possible number of linear combinations of the predictors, called the sufficient predictors, that retain all of the information in the predictors about the response distribution. In this article, we propose a Bayesian solution for sufficient dimension reduction. We directly model the response density in terms of the sufficient predictors using a finite mixture model. This approach is computationally efficient and offers a unified framework to handle categorical predictors, missing predictors, and Bayesian variable selection. We illustrate the method using both a simulation study and an analysis of an HIV data set.
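The article's Bayesian mixture approach is not reproduced here, but the sufficient dimension reduction goal itself can be illustrated with sliced inverse regression (SIR), a classical frequentist estimator of the same subspace. The sketch below is a minimal illustration, not the article's method; the function name `sir_directions` is ours, and it assumes a continuous response and roughly elliptical predictors.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Sliced inverse regression: estimate directions spanning the
    sufficient dimension reduction subspace (illustrative sketch)."""
    n, p = X.shape
    # Standardize (whiten) the predictors
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ inv_sqrt
    # Slice the sorted response and average the whitened predictors per slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of the between-slice covariance span the subspace
    _, v = np.linalg.eigh(M)
    dirs = inv_sqrt @ v[:, ::-1][:, :n_dirs]
    dirs /= np.linalg.norm(dirs, axis=0)  # unit-norm directions
    return dirs
```

With a single-index model y = f(Xβ) + ε and a monotone link f, the leading estimated direction aligns (up to sign) with β, which is the sense in which the linear combination Xβ is a "sufficient predictor."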
Similar resources
Supervised Dimension Reduction Using Bayesian Mixture Modeling
We develop a Bayesian framework for supervised dimension reduction using a flexible nonparametric Bayesian mixture modeling approach. Our method retrieves the dimension reduction or d.r. subspace by utilizing a dependent Dirichlet process that allows for natural clustering of the data in terms of both the response and predictor variables. Formal probabilistic models with likelihoods and priors...
Handling Missing Data with Variational Bayesian Learning of ICA
Missing data is common in real-world datasets and is a problem for many estimation techniques. We have developed a variational Bayesian method to perform Independent Component Analysis (ICA) on high-dimensional data containing missing entries. Missing data are handled naturally in the Bayesian framework by integrating the generative density model. Modeling the distributions of the independent s...
On the consistency of coordinate-independent sparse estimation with BIC
Chen et al. (2010) propose a unified method – coordinate-independent sparse estimation (CISE) – that is able to simultaneously achieve sparse sufficient dimension reduction and screen out irrelevant and redundant variables efficiently. However, its attractive features depend on appropriate choice of the tuning parameter. In this note, we re-examine the Bayesian information criterion (BIC) in su...
A note on shrinkage sliced inverse regression
We employ Lasso shrinkage within the context of sufficient dimension reduction to obtain a shrinkage sliced inverse regression estimator, which provides easier interpretations and better prediction accuracy without assuming a parametric model. The shrinkage sliced inverse regression approach can be employed for both single-index and multiple-index models. Simulation studies suggest that the new...
Copula based factorization in Bayesian multivariate infinite mixture models
Bayesian nonparametric models based on infinite mixtures of density kernels have been recently gaining in popularity due to their flexibility and feasibility of implementation even in complicated modeling scenarios. However, these models have been rarely applied in more than one dimension. Indeed, implementation in the multivariate case is inherently difficult due to the rapidly increasing numb...
Journal:
- Biometrics
Volume 67, Issue 3
Pages: -
Publication date: 2011